MSIS Evaluation: Multiple Arcs

This notebook shows GEODYN II output from three versions of the MSIS thermospheric density model: MSIS-86, NRLMSISE-00, and MSIS 2.0.

The run parameters for the GEODYN run demoed here are:

- Satellite:             Starlette
- Observation Datatype:  SLR
- Accel9 Adjustments:    Off
- Density Models:        MSIS-86, NRLMSISE-00, MSIS 2.0

Data Arcs available for starlette:

- Arc 1 --------  14 days ------ 03/09/14 - 03/09/28
- Arc 2 --------  14 days ------ 03/09/28 - 03/10/12
- Arc 3 --------  14 days ------ 03/10/12 - 03/10/26
- Arc 4 --------  14 days ------ 03/10/26 - 03/11/09
- Arc 5 --------  14 days ------ 03/11/09 - 03/11/23
- Arc 6 --------  14 days ------ 03/11/23 - 03/12/07
- Arc 7 --------  14 days ------ 03/12/07 - 03/12/21
- Arc 8 --------  14 days ------ 03/12/21 - 04/01/05

Analysis

On Starlette’s orbit

  • Starlette is in a somewhat eccentric orbit (apogee = 1110727.924 m; perigee = 810723.637 m). This eccentricity accounts for much of the variation in density seen along a single orbit.

This is averaged over in the orbit-averaged plot, which better displays the variation in density due to solar and geomagnetic activity.

  • Starlette by design has a small A/m (area to mass ratio), and a spherical cannonball shape. This makes it ideal for atmospheric drag research.
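The apogee and perigee altitudes quoted above determine the eccentricity directly; a minimal sketch (the radii assume a spherical Earth with the WGS-84 equatorial radius, so the result is approximate):

```python
# Altitudes copied from the GEODYN summary above (meters)
apogee_alt  = 1110727.924
perigee_alt = 810723.637
R_earth     = 6378137.0   # WGS-84 equatorial radius (spherical-Earth assumption)

r_apogee  = R_earth + apogee_alt
r_perigee = R_earth + perigee_alt

# Eccentricity from the two apsis radii: e = (ra - rp) / (ra + rp)
eccentricity = (r_apogee - r_perigee) / (r_apogee + r_perigee)
print(round(eccentricity, 4))  # ≈ 0.0204
```

An eccentricity of ~0.02 is small in absolute terms, but over a 300 km apogee-to-perigee altitude swing the exponential density profile still produces the large along-orbit density variation seen in the plots below.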

Relative model performance

  • The results presented here suggest that MSIS 2.0 performs better over many arcs.


Load the simulations over all arcs

[1]:
import numpy as np
import pandas as pd
from datetime import datetime, timedelta
import os
import linecache
from sys import exit


def Read_GEODYN_func_MultipleArcs(geodyn_run_params):
    '''
    This function acts as an intermediary to call to the
    primary functions that call the GEODYN Output.
    '''


    # Map the file naming convention onto the satellite name used downstream
    if geodyn_run_params['sat_file'] == 'st':
        sat_name = 'starlette'



    # Initialize dicts that will store the data for each arc in arc_list
    SatMain_AdjustedParams_arcs = {}
    trajectory_df_arcs = {}
    den_arcs = {}
    resids_observed_arcs = {}
    resids_summary_station_arcs = {}
    resids_meas_summary_arcs = {}
    RunStats_arcs = {}

    # Using the input params, construct the filenames for this arc:
    for arc in geodyn_run_params['arc_list']:

        file_name =  geodyn_run_params['sat_file'] + arc + '.'+ geodyn_run_params['grav_id']
        print('              Loading ', geodyn_run_params['path_to_data'] ,file_name,'\n' ,sep = '')





        # First check if the arc data exists in the place you think it does.
        iieout_file  = geodyn_run_params['path_to_data'] + 'IIEOUT/'+ file_name
        ascii_xyz_file = geodyn_run_params['path_to_data'] + 'XYZ_TRAJ/'+ file_name
        density_file = geodyn_run_params['path_to_data'] + 'DENSITY/'+ file_name




        # Do some pretty printing if it is wanted:
        if geodyn_run_params['verbose_loading'] == True:
            if os.path.isfile(iieout_file) == True:
                print('       File exists: iieout \n              ',iieout_file)
            else:
                print('ERROR: Not the correct path for file: iieout\n',iieout_file )
                exit(1)
            if os.path.isfile(ascii_xyz_file) == True:
                print('       File exists: ascii_xyz \n              ',ascii_xyz_file)
            else:
                print('ERROR: Not the correct path for file: ascii_xyz\n',ascii_xyz_file )
                exit(1)
            if os.path.isfile(density_file) == True:
                print('       File exists: densityfil \n              ',density_file)
            else:
                print('ERROR: Not the correct path for file: densityfil\n',density_file )
                exit(1)
        else:
            print(' ')
        if geodyn_run_params['verbose_loading'] == True:
            print('\n       Loading data... \n')
        else:
            pass






        # ######################################
        # ##    Read Adjusted Parameters
        # ######################################
        import sys
        sys.path.insert(0, '/data/geodyn_proj/analysis/util_funcs/py_read_geodyn_output/')
        from b_ReadGEODYN import Save_AdjustedParameters_geodyn

        SatMain_AdjustedParams_arcs[arc] = Save_AdjustedParameters_geodyn(geodyn_run_params['SAT_ID'],
                                                                iieout_file,
                                                                geodyn_run_params['AccelStatus'],
                                                                geodyn_run_params['DATA_TYPE'])
        if geodyn_run_params['verbose_loading'] == True:
            print('       Parameter adjustment data loaded')
        else:
            pass



        # #####################################
        # #        Read Trajectory
        # #####################################
        from b_ReadGEODYN import read_ascixyz
        trajectory_df_arcs[arc] = read_ascixyz(ascii_xyz_file, geodyn_run_params['YR'] )


        if geodyn_run_params['verbose_loading'] == True:
            print('       Trajectory data loaded')
        else:
            pass




        ######################################
        ##       Read Density Values
        ######################################
        from b_ReadGEODYN import read_density_file
        den_arcs[arc] = read_density_file(density_file, geodyn_run_params['YR'] )



        if geodyn_run_params['verbose_loading'] == True:
            print('       Density data loaded')
        else:
            pass



        ######################################
        ##   -   Read Residual Values   -   ##
        ######################################
        from b_ReadGEODYN import read_observed_resids, read_residual_summarybystation, read_resid_measurement_summaries


        resids_observed_arcs[arc] = read_observed_resids(iieout_file, geodyn_run_params['YR'] , geodyn_run_params['DATA_TYPE'])
        resids_summary_station_arcs[arc] = read_residual_summarybystation(iieout_file)
        resids_meas_summary_arcs[arc] = read_resid_measurement_summaries(iieout_file)



        if geodyn_run_params['verbose_loading'] == True:
            print('       Observation Residuals \n       Summary by Station \n        Measurement Summary data loaded')
        else:
            pass

        ######################################
        ##    -   Read IIEOUT STATS    -    ##
        ######################################
        # (indented into the arc loop so stats are saved for every arc,
        #  not just the final one)
        from b_ReadGEODYN import save_stats_to_dict
        RunStats_arcs[arc] = save_stats_to_dict(iieout_file, geodyn_run_params['den_model'], sat_name, geodyn_run_params['DATA_TYPE'], Verbose_Stats=geodyn_run_params['Verbose_Stats'] )


    ######################################
    ##       Return collected Datasets
    ######################################
    return(SatMain_AdjustedParams_arcs,
            trajectory_df_arcs,
            den_arcs,
            resids_observed_arcs,
            resids_summary_station_arcs,
            resids_meas_summary_arcs,
            RunStats_arcs)

[2]:
%load_ext autoreload
%autoreload 2

# Load multiple arcs:
##################################
# INPUT PARAMETERS:
##################################


arc_list = ['030914_2wk',
            '030928_2wk',
            '031012_2wk',
            '031026_2wk',
            '031109_2wk',
            '031123_2wk',
            '031207_2wk',
#             '031221_2wk',
           ]

base_run_params = {}
base_run_params['arc_list']        = arc_list
base_run_params['sat_file']        = 'st'
base_run_params['SAT_ID']          = 7501001
base_run_params['grav_id']         = 'goco05s'
base_run_params['local_path']      = '/data/geodyn_proj/analysis/starlette_analysis/'
base_run_params['AccelStatus']     = 'acceloff'

base_run_params['den_model']       = 'msis86'

path_to_model = ('/data/geodyn_proj/runs_geodyn/results/'+
                 base_run_params['sat_file'] +'/'+
                 base_run_params['den_model']+'/'+
                 base_run_params['den_model']+'_'+
                 base_run_params['AccelStatus'] +'/')




base_run_params['path_to_data']    = path_to_model
base_run_params['YR']              = 2003
base_run_params['DATA_TYPE']       = 'SLR'
base_run_params['verbose_loading'] = False
base_run_params['Verbose_Stats']   = False
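Since plain assignment of a dict binds an alias rather than a copy, it is safest to build each model's parameter set from a fresh copy of the base. A sketch of a small helper along those lines (the name `params_for_model` is mine, not part of the GEODYN tooling):

```python
def params_for_model(base_params, den_model):
    """Return a fresh parameter dict for one density model.

    Copies the base dict (so the base is never mutated) and rebuilds
    the results path for the requested model.
    """
    params = dict(base_params)   # shallow copy; arc_list is still shared
    params['den_model'] = den_model
    params['path_to_data'] = ('/data/geodyn_proj/runs_geodyn/results/' +
                              params['sat_file'] + '/' +
                              den_model + '/' +
                              den_model + '_' + params['AccelStatus'] + '/')
    return params

# Usage sketch:
# geodyn_run_params_m2 = params_for_model(base_run_params, 'msis00')
```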



[3]:
geodyn_run_params_m1 = base_run_params.copy()  # copy, so later model runs do not overwrite these settings

(AdjustedParams_m1,
trajectory_df_m1,
Density_m1,
resids_observed_m1,
resids_summary_station_m1,
resids_meas_summary_m1,
RunStats_m1)  = Read_GEODYN_func_MultipleArcs(geodyn_run_params_m1)

              Loading /data/geodyn_proj/runs_geodyn/results/st/msis86/msis86_acceloff/st030914_2wk.goco05s


              Loading /data/geodyn_proj/runs_geodyn/results/st/msis86/msis86_acceloff/st030928_2wk.goco05s


              Loading /data/geodyn_proj/runs_geodyn/results/st/msis86/msis86_acceloff/st031012_2wk.goco05s


              Loading /data/geodyn_proj/runs_geodyn/results/st/msis86/msis86_acceloff/st031026_2wk.goco05s


              Loading /data/geodyn_proj/runs_geodyn/results/st/msis86/msis86_acceloff/st031109_2wk.goco05s


              Loading /data/geodyn_proj/runs_geodyn/results/st/msis86/msis86_acceloff/st031123_2wk.goco05s


              Loading /data/geodyn_proj/runs_geodyn/results/st/msis86/msis86_acceloff/st031207_2wk.goco05s


[4]:
geodyn_run_params_m2 = base_run_params.copy()

geodyn_run_params_m2['den_model']       = 'msis00'
path_to_model = ('/data/geodyn_proj/runs_geodyn/results/'+
                 geodyn_run_params_m2['sat_file'] +'/'+
                 geodyn_run_params_m2['den_model']+'/'+
                 geodyn_run_params_m2['den_model']+'_'+
                 geodyn_run_params_m2['AccelStatus'] +'/')
geodyn_run_params_m2['path_to_data']    = path_to_model


(AdjustedParams_m2,
trajectory_df_m2,
Density_m2,
resids_observed_m2,
resids_summary_station_m2,
resids_meas_summary_m2,
RunStats_m2)  = Read_GEODYN_func_MultipleArcs(geodyn_run_params_m2)

              Loading /data/geodyn_proj/runs_geodyn/results/st/msis00/msis00_acceloff/st030914_2wk.goco05s


              Loading /data/geodyn_proj/runs_geodyn/results/st/msis00/msis00_acceloff/st030928_2wk.goco05s


              Loading /data/geodyn_proj/runs_geodyn/results/st/msis00/msis00_acceloff/st031012_2wk.goco05s


              Loading /data/geodyn_proj/runs_geodyn/results/st/msis00/msis00_acceloff/st031026_2wk.goco05s


              Loading /data/geodyn_proj/runs_geodyn/results/st/msis00/msis00_acceloff/st031109_2wk.goco05s


              Loading /data/geodyn_proj/runs_geodyn/results/st/msis00/msis00_acceloff/st031123_2wk.goco05s


              Loading /data/geodyn_proj/runs_geodyn/results/st/msis00/msis00_acceloff/st031207_2wk.goco05s


[5]:
geodyn_run_params_m3 = base_run_params.copy()

geodyn_run_params_m3['den_model']       = 'msis2'
path_to_model = ('/data/geodyn_proj/runs_geodyn/results/'+
                 geodyn_run_params_m3['sat_file'] +'/'+
                 geodyn_run_params_m3['den_model']+'/'+
                 geodyn_run_params_m3['den_model']+'_'+
                 geodyn_run_params_m3['AccelStatus'] +'/')
geodyn_run_params_m3['path_to_data']    = path_to_model


(AdjustedParams_m3,
trajectory_df_m3,
Density_m3,
resids_observed_m3,
resids_summary_station_m3,
resids_meas_summary_m3,
RunStats_m3)  = Read_GEODYN_func_MultipleArcs(geodyn_run_params_m3)

              Loading /data/geodyn_proj/runs_geodyn/results/st/msis2/msis2_acceloff/st030914_2wk.goco05s


              Loading /data/geodyn_proj/runs_geodyn/results/st/msis2/msis2_acceloff/st030928_2wk.goco05s


              Loading /data/geodyn_proj/runs_geodyn/results/st/msis2/msis2_acceloff/st031012_2wk.goco05s


              Loading /data/geodyn_proj/runs_geodyn/results/st/msis2/msis2_acceloff/st031026_2wk.goco05s


              Loading /data/geodyn_proj/runs_geodyn/results/st/msis2/msis2_acceloff/st031109_2wk.goco05s


              Loading /data/geodyn_proj/runs_geodyn/results/st/msis2/msis2_acceloff/st031123_2wk.goco05s


              Loading /data/geodyn_proj/runs_geodyn/results/st/msis2/msis2_acceloff/st031207_2wk.goco05s


[6]:
geodyn_run_params_m1
[6]:
{'arc_list': ['030914_2wk',
  '030928_2wk',
  '031012_2wk',
  '031026_2wk',
  '031109_2wk',
  '031123_2wk',
  '031207_2wk'],
 'sat_file': 'st',
 'SAT_ID': 7501001,
 'grav_id': 'goco05s',
 'local_path': '/data/geodyn_proj/analysis/starlette_analysis/',
 'AccelStatus': 'acceloff',
 'den_model': 'msis86',
 'path_to_data': '/data/geodyn_proj/runs_geodyn/results/st/msis86/msis86_acceloff/',
 'YR': 2003,
 'DATA_TYPE': 'SLR',
 'verbose_loading': False,
 'Verbose_Stats': False}
[26]:
# initialize plotly:
import plotly.graph_objects as go
import numpy as np
from plotly.offline import plot, iplot
from plotly.subplots import make_subplots
import plotly.express as px
import datetime


col1 = px.colors.qualitative.Plotly[0]
col2 = px.colors.qualitative.Plotly[1]
col3 = px.colors.qualitative.Plotly[2]



def add_arc_background_w_text(y_vals):
    '''Define the arc parameters for this run to be plotted as background '''

    #### Arc Background + Label ####
    fig.add_vrect(
        x0="2003-09-14", x1="2003-09-28",
        fillcolor="LightSkyBlue", opacity=0.3,
        layer="below", line_width=0,
    )

    # Create scatter trace of text labels
    fig.add_trace(go.Scatter(
        x=["2003-09-20"],
        y=[y_vals],
        text=["Arc 1",
             ],
        mode="text",
        showlegend=False,

    ))


    #### Arc Background + Label ####
    fig.add_vrect(
        x0="2003-09-28", x1="2003-10-12",
        fillcolor="LightSlateGrey", opacity=0.3,
        layer="below", line_width=0,
    )
    # Create scatter trace of text labels
    fig.add_trace(go.Scatter(
        x=["2003-10-4"],
        y=[y_vals],
        text=["Arc 2",
             ],
        mode="text",
        showlegend=False,
    ))

    #### Arc Background + Label ####
    fig.add_vrect(
        x0="2003-10-12", x1="2003-10-26",
        fillcolor="LightSkyBlue", opacity=0.3,
        layer="below", line_width=0,
    )
    # Create scatter trace of text labels
    fig.add_trace(go.Scatter(
        x=["2003-10-18"],
        y=[y_vals],
        text=["Arc 3",
             ],
        mode="text",
        showlegend=False,
    ))

    #### Arc Background + Label ####
    fig.add_vrect(
        x0="2003-10-26", x1="2003-11-09",
        fillcolor="LightSlateGrey", opacity=0.3,
        layer="below", line_width=0,
    )
    # Create scatter trace of text labels
    fig.add_trace(go.Scatter(
        x=["2003-11-03"],
        y=[y_vals],
        text=["Arc 4",
             ],
        mode="text",
        showlegend=False,
    ))

#  - Arc 2 --------  14 days ------ 03/09/28 - 03/10/12
#  - Arc 3 --------  14 days ------ 03/10/12 - 03/10/26
#  - Arc 4 --------  14 days ------ 03/10/26 - 03/11/09
#  - Arc 5 --------  14 days ------ 03/11/09 - 03/11/23
#  - Arc 6 --------  14 days ------ 03/11/23 - 03/12/07
#  - Arc 7 --------  14 days ------ 03/12/07 - 03/12/21
#  - Arc 8 --------  14 days ------ 03/12/21 - 04/01/5


    fig.add_vrect(
    x0="2003-11-09", x1="2003-11-23",
    fillcolor="LightSkyBlue", opacity=0.3,
    layer="below", line_width=0,
    )

    fig.add_vrect(
    x0="2003-11-23", x1="2003-12-07",
    fillcolor="LightSlateGrey", opacity=0.3,
    layer="below", line_width=0,
    )

    fig.add_vrect(
    x0="2003-12-07", x1="2003-12-21",
    fillcolor="LightSkyBlue", opacity=0.3,
    layer="below", line_width=0,
    )
    return(fig)




def legend_as_annotation():
    fig.add_annotation(
            x=1.1,
            y=.9,
            xref="x domain",
            yref="y domain",
            showarrow=False,
            text="MSISe86",
            font=dict(
                size=16,
                color="#ffffff"
                ),
            align="center",
            bordercolor="#c7c7c7",
            borderwidth=2,
            borderpad=4,
            bgcolor=px.colors.qualitative.Plotly[0],
            opacity=0.9
            )

    fig.add_annotation(
            x=1.1,
            y=.8,
            xref="x domain",
            yref="y domain",
            showarrow=False,
            text="MSISe00",
            font=dict(
                size=16,
                color="#ffffff"
                ),
            align="center",
            bordercolor="#c7c7c7",
            borderwidth=2,
            borderpad=4,
            bgcolor=px.colors.qualitative.Plotly[1],
            opacity=0.9
            )

    fig.add_annotation(
        x=1.1,
        y=.7,
        xref="x domain",
        yref="y domain",
        showarrow=False,
        text="MSIS 2.0",
        font=dict(
            size=16,
            color="#ffffff"
            ),
        align="center",
        bordercolor="#c7c7c7",
        borderwidth=2,
        borderpad=4,
        bgcolor=px.colors.qualitative.Plotly[2],
        opacity=0.9
        )
    return(fig)
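The arc backgrounds in `add_arc_background_w_text` are written out by hand; since every arc is 14 days, the spans can be generated in a loop. A sketch (the function name `arc_spans` and the 14-day cadence are my assumptions, taken from the arc table at the top of the notebook):

```python
from datetime import datetime, timedelta

def arc_spans(start="2003-09-14", n_arcs=7, days=14):
    """Yield (x0, x1, label, fillcolor) for each consecutive arc,
    alternating the two background colors used above."""
    colors = ["LightSkyBlue", "LightSlateGrey"]
    t0 = datetime.strptime(start, "%Y-%m-%d")
    for k in range(n_arcs):
        x0 = t0 + timedelta(days=k * days)
        x1 = x0 + timedelta(days=days)
        yield (x0.strftime("%Y-%m-%d"), x1.strftime("%Y-%m-%d"),
               f"Arc {k + 1}", colors[k % 2])

# Usage sketch (inside a plotting cell):
# for x0, x1, label, color in arc_spans():
#     fig.add_vrect(x0=x0, x1=x1, fillcolor=color, opacity=0.3,
#                   layer="below", line_width=0)
```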

Coefficient of Drag Adjustments

Details and Background:

  • \(C_d\) is an adjusted parameter in GEODYN: each iteration it is estimated and adjusted to improve the orbit-determination fit.

  • \(C_d\) is treated as a constant in GEODYN (within any iteration). It varies slightly with satellite shape and atmospheric composition, but for any geodetically useful satellite it may be treated as a satellite-dependent constant. When \(C_d\) is estimated as time dependent, it is held constant over each designated time interval.

  • \(\bar{A}_D = -\frac{1}{2}\, C_d\, \frac{A_s}{m_s}\, \rho_d\, v_r\, \bar{v}_r\)

    • \(A_D\) acceleration due to atmospheric drag force

    • \(C_d\) is the satellite drag coefficient

    • \(A_s\) is the cross-sectional area of the satellite

    • \(m_s\) is the mass of the satellite

    • \(\rho_d\) is the density of the atmosphere at the satellite position

    • \(\bar{v}_r\) is the velocity vector of the satellite relative to the atmosphere, and

    • \(v_r\) is the magnitude of \(\bar{v}_r\).
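Plugging representative numbers into the drag equation above makes the scaling concrete; every value here is an illustrative placeholder, not Starlette's actual parameters:

```python
import numpy as np

# Illustrative values only (NOT Starlette's actual parameters):
C_d   = 2.3          # drag coefficient
A_s   = 0.0452       # cross-sectional area, m^2 (hypothetical)
m_s   = 47.0         # satellite mass, kg (hypothetical)
rho_d = 5e-14        # density at the satellite position, kg/m^3
v_r   = np.array([7300.0, 0.0, 0.0])  # velocity relative to atmosphere, m/s

v_mag = np.linalg.norm(v_r)
# A_D = -(1/2) * C_d * (A_s/m_s) * rho_d * |v_r| * v_r
A_D = -0.5 * C_d * (A_s / m_s) * rho_d * v_mag * v_r
print(A_D)  # anti-parallel to v_r, i.e. drag decelerates the satellite
```

Note the small A/m in the prefactor: for a dense cannonball like Starlette the drag acceleration is tiny, which is exactly why the recovered \(C_d\) is a clean probe of density-model error.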

Adjusted Cd Plot (timeseries)

[29]:
SAT_ID = geodyn_run_params_m1['SAT_ID']

which_stat = 'CURRENT_VALUE' # 2 is the current val



# make plot:
fig = make_subplots(
    rows=1, cols=1,)

for ii,arc in enumerate(geodyn_run_params_m1['arc_list']):
    i_arc = ii+1
    last_iter = list(AdjustedParams_m1[arc].keys())[-1]
    labels = list(AdjustedParams_m1[arc][last_iter][SAT_ID]['0CD'].keys())


    val_list_1 = []
    for i in AdjustedParams_m1[arc][last_iter][SAT_ID]['0CD'].keys():
        val_list_1.append(AdjustedParams_m1[arc][last_iter][SAT_ID]['0CD'][i][which_stat])
    fig.add_trace(go.Scatter(x=labels,
                                     y=val_list_1,
                                    name= geodyn_run_params_m1['den_model'] + ' |  Arc ' +str(i_arc) +' | Iters: '+ str(last_iter),
                                     mode='markers',
                                     marker=dict(
                                      color=px.colors.qualitative.Plotly[0],
                                     size=7,),
                                     showlegend=False,
                                     ),
                                     row=1, col=1,
                                     )
    last_iter = list(AdjustedParams_m2[arc].keys())[-1]
    val_list_2 = []
    for i in AdjustedParams_m2[arc][last_iter][SAT_ID]['0CD'].keys():
        val_list_2.append(AdjustedParams_m2[arc][last_iter][SAT_ID]['0CD'][i][which_stat])
    fig.add_trace(go.Scatter(x=labels,
                                     y=val_list_2,
                                    name=geodyn_run_params_m2['den_model'] + ' |  Arc ' +str(i_arc) +' | Iters: '+ str(last_iter),
                                     mode='markers',
                                     marker=dict(
                                     color=px.colors.qualitative.Plotly[1],
                                     symbol = 'circle',
                                     size=7,),
                                     showlegend=False,
                                     ),
                                     row=1, col=1,
                                     )
    last_iter = list(AdjustedParams_m3[arc].keys())[-1]
    val_list_3 = []
    for i in AdjustedParams_m3[arc][last_iter][SAT_ID]['0CD'].keys():
        val_list_3.append(AdjustedParams_m3[arc][last_iter][SAT_ID]['0CD'][i][which_stat])
    fig.add_trace(go.Scatter(x=labels,
                                     y=val_list_3,
                                    name= geodyn_run_params_m3['den_model'] + ' | Arc ' +str(i_arc) +' | Iters: '+ str(last_iter),
                                     mode='markers',
                                     marker=dict(
                                     color=px.colors.qualitative.Plotly[2],
                                     symbol = 'circle',
                                     size=7,),
                                     showlegend=False,
                                     ),
                                     row=1, col=1,
                                     )

fig = add_arc_background_w_text(9.5)
fig = legend_as_annotation()


# fix layout info:
fig.update_yaxes( title="Cd ",exponentformat= 'power',row=1, col=1)
fig.update_xaxes( title="Date", row=1, col=1)
fig.update_layout(title="Time Dependent Drag Coefficient ")
fig.update_layout(
    font=dict(
        size=14,
             ),)
# fig.update_layout(
#     autosize=False,
#     width=600,
#     height=500,)

iplot(fig)






Cd Analysis

A few key features can be seen in the above plot:

  1. When there is a storm event early in an arc, the adjusted Cd responds by correcting the mean density provided by each model.

  2. The MSISe00 corrections are smaller at this range of altitudes than those of MSISe86 and MSIS 2.0.

  3. Overall, from a POD perspective, all three models seem to do a good job of accounting for density variations. Frank notes that the onus is on us to show that a real storm event is occurring over these several days.

TODO: add storm event from the Kp/Ap and F10.7 timeseries.
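As a first pass at flagging such an event automatically, contiguous runs of elevated ap can be located with a small helper (the `storm_intervals` name and the threshold of 40 are my placeholders, not a GEODYN convention):

```python
import numpy as np

def storm_intervals(ap, threshold=40):
    """Return (start_idx, end_idx) pairs (end exclusive) of contiguous
    samples where ap >= threshold; a crude storm-flagging heuristic."""
    active = np.asarray(ap) >= threshold
    # Rising/falling edges of the boolean mask
    edges = np.diff(active.astype(int))
    starts = list(np.where(edges == 1)[0] + 1)
    ends   = list(np.where(edges == -1)[0] + 1)
    if active[0]:
        starts.insert(0, 0)          # storm already in progress at start
    if active[-1]:
        ends.append(len(active))     # storm still in progress at end
    return list(zip(starts, ends))

# Example with synthetic data:
# storm_intervals([10, 50, 60, 20, 45, 10])  ->  [(1, 3), (4, 5)]
```

The returned index pairs could then be converted to dates and drawn as `fig.add_vrect` spans on the Ap/F10.7 panel.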


Density Output

  • The density output is sourced from the DRAG.f90 subroutine.

[32]:
data_nums_2 = 100   # plot every 100th point


fig = make_subplots(rows=1, cols=1,
        subplot_titles=(["Every 100th point"]),
                       )

for arc in base_run_params['arc_list']:

    fig.add_trace(go.Scatter(x=Density_m1[arc]['Date'][::data_nums_2],
                             y=Density_m1[arc]['rho (kg/m**3)'][::data_nums_2],
                             name = geodyn_run_params_m1['den_model'],
                             mode='markers',
                                marker=dict(color=col1,
                                size=3,
                                ),
                              showlegend=False,
                               ),
                               row=1, col=1,
                               )

    fig.add_trace(go.Scatter(x=Density_m2[arc]['Date'][::data_nums_2],
                         y=Density_m2[arc]['rho (kg/m**3)'][::data_nums_2],
                         name = geodyn_run_params_m2['den_model'],
                         mode='markers',
                            marker=dict(color=col2,
                            size=3,
                            ),
                          showlegend=False,
                           ),
                           row=1, col=1,
                           )
    fig.add_trace(go.Scatter(x=Density_m3[arc]['Date'][::data_nums_2],
                         y=Density_m3[arc]['rho (kg/m**3)'][::data_nums_2],
                         name = geodyn_run_params_m3['den_model'],
                         mode='markers',
                            marker=dict(color=col3,
                            size=3,
                            ),
                          showlegend=False,
                           ),
                           row=1, col=1,
                           )

fig.update_layout(
    title="Density along Starlette Orbit",
    )
fig.update_yaxes(type="log", exponentformat= 'power',)
fig.update_layout(legend= {'itemsizing': 'constant'})
fig.update_yaxes(title_text="kg/m^3", row=1, col=1)
fig.update_xaxes(title_text="Date", row=1, col=1)


fig = add_arc_background_w_text(9.8e-14)
fig = legend_as_annotation()

fig.update_layout(
    autosize=False,
    width=900,
    height=700,
)
fig.update_layout(
    font=dict(
        size=18,
             ),)

iplot(fig)

[34]:


fig = make_subplots(rows=1, cols=1,
        subplot_titles=(["Orbit averaged Density"]),
                       )
#################### MSIS86
for i_arc, arc in enumerate(base_run_params['arc_list']):
    lat = np.asarray(Density_m1[arc]['Lat'])
    time_pd = pd.to_datetime(Density_m1[arc]['Date'])
    i = np.nonzero( (lat[1:]*lat[0:-1] < 0) & (lat[1:] > lat[0:-1]) )  # ascending equator crossings
    i = i[0]

    d_avg = np.zeros(np.size(i))
    height_avg = np.zeros(np.size(i))
    time_avg = []

    d_avg_roll = []

    roll_avg_count = 0
    for j in range(np.size(i)-1):
    #     print(j+1)
        d_avg[j] = np.mean(  Density_m1[arc]['rho (kg/m**3)'][   i[j] : i[j+1]-1  ]  )
        height_avg[j] = np.mean(  Density_m1[arc]['Height (meters)'][   i[j] : i[j+1]-1  ]  )
        mean_time=time_pd[   i[j] : i[j+1]-1  ].mean()
        time_avg.append(  mean_time)

        if roll_avg_count ==2:
            d_avg_roll.append(np.mean([ d_avg[j],  d_avg[j-1]]))

            roll_avg_count =0

        roll_avg_count+=1

    fig.add_trace(go.Scatter(x=time_avg[::2],
                             y=d_avg_roll,
                            name= ' Arc ' +str(i_arc) ,
                             mode='markers',
                             marker=dict(
                             color=col1,
                             size=3,),
                             showlegend=False,
                               ),
                               row=1, col=1,
                               )
#################### MSIS00

    lat = np.asarray(Density_m2[arc]['Lat'])
    time_pd = pd.to_datetime(Density_m2[arc]['Date'])
    i = np.nonzero( (lat[1:]*lat[0:-1] < 0) & (lat[1:] > lat[0:-1]) )  # ascending equator crossings
    i = i[0]

    d_avg = np.zeros(np.size(i))
    height_avg = np.zeros(np.size(i))
    time_avg = []

    d_avg_roll = []

    roll_avg_count = 0
    for j in range(np.size(i)-1):
    #     print(j+1)
        d_avg[j] = np.mean(  Density_m2[arc]['rho (kg/m**3)'][   i[j] : i[j+1]-1  ]  )
        height_avg[j] = np.mean(  Density_m2[arc]['Height (meters)'][   i[j] : i[j+1]-1  ]  )
        mean_time=time_pd[   i[j] : i[j+1]-1  ].mean()
        time_avg.append(  mean_time)

        if roll_avg_count ==2:
            d_avg_roll.append(np.mean([ d_avg[j],  d_avg[j-1]]))

            roll_avg_count =0

        roll_avg_count+=1

    fig.add_trace(go.Scatter(x=time_avg[::2],
                             y=d_avg_roll,
                            name= ' Arc ' +str(i_arc) ,
                             mode='markers',
                             marker=dict(
                             color=col2,
                             size=3,),
                             showlegend=False,
                               ),
                               row=1, col=1,
                               )


#################### MSISe2

    lat = np.asarray(Density_m3[arc]['Lat'])
    time_pd = pd.to_datetime(Density_m3[arc]['Date'])
    i = np.nonzero( (lat[1:]*lat[0:-1] < 0) & (lat[1:] > lat[0:-1]) )  # ascending equator crossings
    i = i[0]

    d_avg = np.zeros(np.size(i))
    height_avg = np.zeros(np.size(i))
    time_avg = []

    d_avg_roll = []

    roll_avg_count = 0
    for j in range(np.size(i)-1):
    #     print(j+1)
        d_avg[j] = np.mean(  Density_m3[arc]['rho (kg/m**3)'][   i[j] : i[j+1]-1  ]  )
        height_avg[j] = np.mean(  Density_m3[arc]['Height (meters)'][   i[j] : i[j+1]-1  ]  )
        mean_time=time_pd[   i[j] : i[j+1]-1  ].mean()
        time_avg.append(  mean_time)

        if roll_avg_count ==2:
            d_avg_roll.append(np.mean([ d_avg[j],  d_avg[j-1]]))

            roll_avg_count =0

        roll_avg_count+=1

    fig.add_trace(go.Scatter(x=time_avg[::2],
                             y=d_avg_roll,
                            name= ' Arc ' +str(i_arc) ,
                             mode='markers',
                             marker=dict(
                             color=col3,
                             size=3,),
                             showlegend=False,
                               ),
                               row=1, col=1,
                               )

fig = add_arc_background_w_text(4.1e-14)
fig = legend_as_annotation()

fig.update_layout(
    title="Density along Starlette Orbit",
    )
fig.update_yaxes(type="log", exponentformat= 'power',)
fig.update_layout(legend= {'itemsizing': 'constant'})
fig.update_yaxes(title_text="kg/m^3", row=1, col=1)
fig.update_xaxes(title_text="Date", row=1, col=1)


fig.update_layout(
    autosize=False,
    width=900,
    height=700,
)
fig.update_layout(
    font=dict(
        size=18,
             ),)

iplot(fig)

Density output Analysis

  • Starlette is in a somewhat eccentric orbit (roughly 800 km x 1100 km), which accounts for the change in density along the orbit.

  • While this is helpful to see, one really needs to demonstrate these results over multiple arcs.

  • The relative densities differ at apogee and perigee:

    • Apogee: MSISe86 and MSISe00 have the higher densities

    • Perigee: MSIS 2.0 has the higher density

TODO: run this over multiple arcs and add an orbit averaged plot
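The per-model averaging loops in the orbit-averaged cell above repeat the same logic three times; they could be collapsed into one helper along these lines (a sketch; column names follow the density DataFrames above):

```python
import numpy as np
import pandas as pd

def orbit_average(den_df, value_col='rho (kg/m**3)'):
    """Average `value_col` between successive ascending equator crossings.

    An ascending crossing is a sign change in latitude with the latitude
    increasing; returns (mean_times, mean_values), one entry per orbit.
    """
    lat  = np.asarray(den_df['Lat'], dtype=float)
    time = pd.to_datetime(den_df['Date'])
    vals = np.asarray(den_df[value_col], dtype=float)

    ascending = (lat[:-1] * lat[1:] < 0) & (lat[1:] > lat[:-1])
    idx = np.where(ascending)[0] + 1   # first sample after each crossing

    times, means = [], []
    for a, b in zip(idx[:-1], idx[1:]):
        means.append(vals[a:b].mean())
        times.append(time.iloc[a:b].mean())
    return times, means

# Usage sketch:
# t_avg, d_avg = orbit_average(Density_m1[arc])
```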

[35]:
f107a =[]
f107d =[]
date = []
ap = []

import pandas as pd
for i,arc in enumerate(base_run_params['arc_list']):
    print(i,arc)
    msisin_den_file = geodyn_run_params_m3['path_to_data'] + "DENSITY/st"+arc + ".goco05s_msisin"
    DEN1_csv = pd.read_csv(msisin_den_file,
                        skiprows = 1,
                        names = ['IYYDDD',
                                 'IYR',
                                  'DAY',
                                 'UTSEC',
                                 'ALTKM',
                                 'GLAT',
                                 'GLON',
                                 'STLOC',
                                 'AVGFLX',
                                 'FLUX',
                                 'AP1',
                                 'AP2',
                                 'AP3',
                                 'AP4',
                                 'AP5',
                                 'AP6',
                                 'AP7',
                                ],
                        sep = r'\s+',
                        )
    DEN1_csv['Date'] = (pd.to_datetime('0'+ ((DEN1_csv['IYR'].astype(int).astype(str))),  format='%y')
                    +  pd.to_timedelta(DEN1_csv['DAY'], unit='days'))

    f107a.append(DEN1_csv['AVGFLX'].values)
    f107d.append(DEN1_csv['FLUX'].values)
    date.append(DEN1_csv['Date'].values)
    ap.append(DEN1_csv['AP7'].values)
0 030914_2wk
1 030928_2wk
2 031012_2wk
3 031026_2wk
4 031109_2wk
5 031123_2wk
6 031207_2wk

[37]:

fig = make_subplots(rows=2, cols=1,
        subplot_titles=(['Ap','F10.7']),
                       )



for i,arc in enumerate(base_run_params['arc_list']):
    i_arc = i+1

    fig.add_trace(go.Scatter(x=pd.to_datetime(date[i][::500]),
                             y=ap[i][::500],
                             name= 'Arc :'+str(i_arc),
                             mode='markers',
                             marker=dict(color = px.colors.qualitative.Plotly[i],
                             size=3,),
                               ),
                               row=1, col=1,
                               )
    fig.add_trace(go.Scatter(x=pd.to_datetime(date[i][::1000]),
                         y=f107d[i][::1000],
#                          name= 'Ap',
                         mode='markers',
                         marker=dict(color = px.colors.qualitative.Plotly[i],
                         size=3,),
                          showlegend=False,

                               ),
                               row=2, col=1,
                               )
fig = add_arc_background_w_text(280)

fig.update_layout(
    title="Geomagnetic and Solar Activity",
    )

fig.update_yaxes(title_text="nT", row=1, col=1)
fig.update_yaxes(title_text="sfu", row=2, col=1)

fig.update_xaxes(title_text="Date", row=2, col=1)
fig.update_layout(legend= {'itemsizing': 'constant'})
fig.update_layout(
    font=dict(
        size=17,
             ),)

fig.update_layout(
    autosize=False,
    width=900,
    height=700,
)
iplot(fig)
[38]:
fig = go.Figure()

for i,arc in enumerate(base_run_params['arc_list']):
    i_arc = i+1

    fig.add_trace(go.Scatter(x=resids_observed_m1[arc]['Date'],
                             y=resids_observed_m1[arc]['Residual'].values.astype(float),
                             name= 'Arc: '+str(i_arc),
                             mode='markers',
                             marker=dict(color=col1,
                             size=3,),
                             showlegend=False,
                             ),
                             )

    fig.add_trace(go.Scatter(x=resids_observed_m2[arc]['Date'],
                             y=resids_observed_m2[arc]['Residual'].values.astype(float),
                             name= 'Arc: '+str(i_arc),
                             mode='markers',
                             marker=dict(color=col2,
                             size=3,),
                             showlegend=False,
                             ),
                             )


    fig.add_trace(go.Scatter(x=resids_observed_m3[arc]['Date'],
                             y=resids_observed_m3[arc]['Residual'].values.astype(float),
                             name= 'Arc: '+str(i_arc),
                             mode='markers',
                             marker=dict(color=col3,
                             size=3,),
                             showlegend=False,
                             ),
                             )


fig = add_arc_background_w_text(np.max(resids_observed_m1[arc]['Residual'].values.astype(float)))
fig = legend_as_annotation()

fig.update_layout(
    title='Observation Residuals',
    yaxis_title="Residuals",
    xaxis_title="Date",
    )
fig.update_layout(legend= {'itemsizing': 'constant'})
fig.update_layout(
    font=dict(
        size=17,
             ),)

iplot(fig)
[ ]:

[ ]:

[ ]:

[ ]:

Residuals – observations

Details:

  • The residuals output by GEODYN should be rescaled to centimeters instead of meters: most of the stations have cm-level (or better) data precision.

  • The mean of the residuals (or the mean by station) is not important - it is very small in these runs - unless a station turns out to have a large systematic range bias due to some temporary anomaly.

    • If you look at the setups, you’ll see sets of “MBIAS” cards - these adjust different sets of range biases, by station per arc or by station per data pass, based on recommendations from the ILRS. They are adjusted as a normal part of the POD process.

[ ]:

[ ]:

[ ]:

[39]:
mark_size = 10

fig = make_subplots(rows=1, cols=1,
                       )

# One pass per density model: (run parameters, residual summaries, trace color)
for run_params, resids_summary, color in zip(
        (geodyn_run_params_m1, geodyn_run_params_m2, geodyn_run_params_m3),
        (resids_meas_summary_m1, resids_meas_summary_m2, resids_meas_summary_m3),
        (col1, col2, col3)):

    for i, arc in enumerate(run_params['arc_list']):
        i_arc = i + 1
        # arc names start with a yymmdd date stamp (e.g. '030914_2wk')
        arc_date = np.datetime64(datetime.datetime.strptime(arc[:6], '%y%m%d'))
        # use the final iteration of the arc's measurement summary
        iter_index = int(resids_summary[arc]['Iter'].values[-1]) - 1
        mean_arc = resids_summary[arc]['MEAN'].values.astype(float)[iter_index]

        fig.add_trace(go.Scatter(x=[arc_date],
                                 y=[mean_arc],
                                 name='Arc: ' + str(i_arc),
                                 mode='markers',
                                 marker=dict(color=color,
                                             size=mark_size),
                                 showlegend=False,
                                 ),
                      row=1, col=1,
                      )


fig = add_arc_background_w_text(.002)
fig = legend_as_annotation()

fig.update_layout(
    title="Mean- Residual Measurement Summary",
    )
fig.update_yaxes(title_text="Residual", row=1, col=1)
fig.update_xaxes(title_text="Arc", row=1, col=1)
fig.update_layout(legend= {'itemsizing': 'constant'})
fig.update_xaxes(tickangle=0)

fig.update_layout(
    font=dict(
        size=15,
             ),)


fig.show()

Analysis

  • The bowtie effect is visible in the residuals above. It is a standard artifact of least-squares fitting: the fit appears tightest near the center of the arc and loosens toward the arc boundaries.
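The bowtie shape falls directly out of the least-squares formalism. As an illustrative stand-in for the orbit fit (not GEODYN's actual estimator), consider a straight-line fit over the arc: the variance of the fitted value at epoch t is sigma^2 * (1/n + (t - tbar)^2 / Stt), which is smallest at the arc midpoint and grows toward the endpoints:

```python
import numpy as np

# Epochs spanning a 14-day "arc" (odd count so one sample sits at the midpoint)
t = np.linspace(0.0, 14.0, 29)
tbar = t.mean()
Stt = np.sum((t - tbar) ** 2)

# Relative variance of the fitted line at each epoch (sigma^2 factored out):
# smallest at the center of the arc, largest at the arc boundaries
rel_var = 1.0 / t.size + (t - tbar) ** 2 / Stt

print(int(np.argmin(rel_var)), rel_var[0] / rel_var[t.size // 2])
```

The minimum lands at the midpoint sample and the endpoint variance is several times larger, which is the bowtie seen in the plotted residuals.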

[40]:
mark_size = 10

fig = make_subplots(rows=1, cols=1,
                       )

# One pass per density model: (run parameters, residual summaries, trace color)
for run_params, resids_summary, color in zip(
        (geodyn_run_params_m1, geodyn_run_params_m2, geodyn_run_params_m3),
        (resids_meas_summary_m1, resids_meas_summary_m2, resids_meas_summary_m3),
        (col1, col2, col3)):

    for i, arc in enumerate(run_params['arc_list']):
        i_arc = i + 1
        # arc names start with a yymmdd date stamp (e.g. '030914_2wk')
        arc_date = np.datetime64(datetime.datetime.strptime(arc[:6], '%y%m%d'))
        # use the final iteration of the arc's measurement summary
        iter_index = int(resids_summary[arc]['Iter'].values[-1]) - 1
        rms_arc = resids_summary[arc]['RMS'].values.astype(float)[iter_index]

        fig.add_trace(go.Scatter(x=[arc_date],
                                 y=[rms_arc],
                                 name='Arc: ' + str(i_arc),
                                 mode='markers',
                                 marker=dict(color=color,
                                             size=mark_size),
                                 showlegend=False,
                                 ),
                      row=1, col=1,
                      )


fig = add_arc_background_w_text(.002)
fig = legend_as_annotation()

fig.update_layout(
    title="RMS- Residual Measurement Summary",
    )
fig.update_yaxes(title_text="RMS", row=1, col=1)
fig.update_xaxes(title_text="Arc", row=1, col=1)
fig.update_layout(legend= {'itemsizing': 'constant'})
fig.update_xaxes(tickangle=0)

fig.update_layout(
    font=dict(
        size=15,
             ),)


fig.show()

Residuals – station summary

  • I have been told that these are not helpful for assessing density models,

  • except to identify a really bad station (these can crop up from time to time because of a station anomaly). If one finds a bad station, one can simply delete its data. In this example, Riyadh (RIYA7832) seems to have a mean bias of about 2 cm. That is a bit large and might be a sign of a station coordinate error or some other station-dependent error. One could solve for an MBIAS for that station over the arc to take care of that small problem - you would have to check that an MBIAS for that station is not already adjusted.

TODO: NEED TO ASK FL WHAT EXACTLY THESE DO
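A quick way to screen for a station like the Riyadh case above is to average the observation residuals by station and flag any station whose mean sits well above the network noise. This is only a sketch - the `Station`/`Residual` column names and the toy values below are hypothetical, not GEODYN output:

```python
import pandas as pd

# Toy residuals table in meters (station IDs and values are made up)
resids = pd.DataFrame({
    'Station':  ['RIYA7832', 'RIYA7832', 'GRAZ7839', 'GRAZ7839'],
    'Residual': [0.021, 0.019, 0.002, -0.002],
})

# Mean residual per station, rescaled from meters to centimeters
station_bias_cm = resids.groupby('Station')['Residual'].mean() * 100.0

# Flag stations whose mean bias exceeds 1 cm as MBIAS candidates
flagged = station_bias_cm[station_bias_cm.abs() > 1.0]
print(flagged)
```

With these toy numbers only RIYA7832 is flagged, at a 2 cm mean bias; in practice the threshold would be set relative to the network's cm-level precision.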

[ ]: